Shape-Constrained Symbolic Regression with NSGA-III

Authors

Abstract

Shape-constrained symbolic regression (SCSR) allows prior knowledge to be included in data-based modeling. This inclusion ensures that certain expected behavior is better reflected by the resulting models. The expected behavior is defined via constraints that refer to the function's form, e.g. monotonicity, concavity, convexity, or boundaries on the model's image. Besides the advantage of obtaining more robust and reliable models by defining constraints over the function's shape, SCSR can be used to find models that are less sensitive to noise and have better extrapolation behavior. This paper presents a multi-criteria approach that minimizes the approximation error as well as the constraint violations. Specifically, the two algorithms NSGA-II and NSGA-III are implemented and compared against each other in terms of model quality and runtime. Both are capable of dealing with multiple objectives: NSGA-II is a well-established multi-objective algorithm that performs well on instances with up to 3 objectives, while NSGA-III is an extension of that algorithm developed to handle problems with "many" objectives (more than 3). Both algorithms are executed on a selected set of benchmark instances taken from physics textbooks. The results indicate that both algorithms are able to find largely feasible solutions, and that NSGA-III provides slight improvements in model quality. Moreover, an improvement in runtime can be observed with the many-objective approach.
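As a rough illustration of the multi-criteria formulation, the sketch below evaluates a single candidate model as a vector of objectives: the approximation error plus one violation measure per shape constraint. It is only a minimal stand-in (finite differences on the training samples instead of the interval-arithmetic bounds commonly used in SCSR), and `objective_vector` and its parameters are hypothetical names rather than the paper's implementation.

```python
import numpy as np

def objective_vector(model, X, y, monotone_increasing_vars=(), image_bounds=None):
    """Return [approximation error, one violation term per shape constraint]
    for one candidate model, where `model` is any callable f(X) -> predictions."""
    y_hat = model(X)
    objectives = [np.mean((y - y_hat) ** 2)]            # approximation error (MSE)

    # Monotonicity: penalise decreasing behaviour along each variable that
    # is required to be non-decreasing (crude finite-difference check).
    for j in monotone_increasing_vars:
        X_step = X.copy()
        X_step[:, j] += 1e-3
        diff = model(X_step) - y_hat
        objectives.append(float(np.sum(np.maximum(0.0, -diff))))

    # Image boundaries: penalise predictions outside the interval [lo, hi].
    if image_bounds is not None:
        lo, hi = image_bounds
        objectives.append(float(np.sum(np.maximum(0.0, lo - y_hat))
                                + np.sum(np.maximum(0.0, y_hat - hi))))
    return np.asarray(objectives)
```

With several such constraint objectives the problem quickly reaches the "many-objective" regime, which is where NSGA-III's reference-direction-based selection is intended to pay off over NSGA-II.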


Related articles

Dimensionally Constrained Symbolic Regression

We describe dimensionally constrained symbolic regression, which has been developed for mass measurement in certain classes of events in high-energy physics (HEP). With symbolic regression, we can derive equations that are well known in HEP. However, in problems with a large number of variables, we find that by constraining the terms allowed in the symbolic regression, convergence behavior is impr...
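As a hedged sketch of what a dimensional constraint could look like in practice (the names and tree encoding below are hypothetical, not taken from the paper), units can be tracked as exponent vectors over base dimensions, and candidate expressions that add dimensionally incompatible terms can be discarded:

```python
import numpy as np

# Units as exponent vectors over base dimensions (mass, length, time).
KG, M, S = np.array([1, 0, 0]), np.array([0, 1, 0]), np.array([0, 0, 1])

def units_of(node):
    """Recursively derive the units of a small expression tree.
    A node is ('var', unit_vector), ('mul', a, b), ('div', a, b) or
    ('add', a, b); addition of mismatched units raises an error, which a
    dimensionally constrained search would use to discard the candidate."""
    kind = node[0]
    if kind == 'var':
        return node[1]
    if kind == 'mul':
        return units_of(node[1]) + units_of(node[2])
    if kind == 'div':
        return units_of(node[1]) - units_of(node[2])
    if kind == 'add':
        left, right = units_of(node[1]), units_of(node[2])
        if not np.array_equal(left, right):
            raise ValueError("dimensionally inconsistent addition")
        return left
    raise ValueError(f"unknown node type {kind!r}")

# Example: momentum = mass * (length / time) has units kg * m / s.
momentum = ('mul', ('var', KG), ('div', ('var', M), ('var', S)))
print(units_of(momentum))   # -> [ 1  1 -1]
```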


Constrained linear regression models for symbolic interval-valued variables

This paper introduces an approach to fitting a constrained linear regression model to interval-valued data. Each example of the learning set is described by a feature vector for which each feature value is an interval. The new approach fits a constrained linear regression model on the midpoints and range of the interval values assumed by the variables in the learning set. The prediction of the ...
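A rough sketch of the center-and-range idea, under the assumption that the constraint amounts to non-negative coefficients in the range model so that predicted lower bounds never exceed predicted upper bounds (the function names are hypothetical):

```python
import numpy as np
from scipy.optimize import nnls

def fit_interval_regression(X_low, X_high, y_low, y_high):
    """Ordinary least squares on interval midpoints plus non-negative least
    squares on interval ranges; returns a predictor for new intervals."""
    X_mid, X_rng = (X_low + X_high) / 2, X_high - X_low
    y_mid, y_rng = (y_low + y_high) / 2, y_high - y_low

    A_mid = np.column_stack([np.ones(len(X_mid)), X_mid])
    beta_mid, *_ = np.linalg.lstsq(A_mid, y_mid, rcond=None)

    A_rng = np.column_stack([np.ones(len(X_rng)), X_rng])
    beta_rng, _ = nnls(A_rng, y_rng)          # constrained: coefficients >= 0

    def predict(x_low, x_high):
        x_mid, x_rng = (x_low + x_high) / 2, x_high - x_low
        mid = beta_mid[0] + x_mid @ beta_mid[1:]
        rng = beta_rng[0] + x_rng @ beta_rng[1:]
        return mid - rng / 2, mid + rng / 2   # predicted interval bounds
    return predict
```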


Symbolic Regression Algorithms with Built-in Linear Regression

Recently, several algorithms for symbolic regression (SR) emerged which employ a form of multiple linear regression (LR) to produce generalized linear models. The use of LR allows the algorithms to create models with relatively small error right from the beginning of the search; such algorithms are thus claimed to be (sometimes by orders of magnitude) faster than SR algorithms based on vanilla ...
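The division of labour described here — an evolutionary search proposes nonlinear basis functions while linear regression fits their coefficients — can be sketched generically as follows; the helper is hypothetical and not any particular algorithm from the paper:

```python
import numpy as np

def fit_linear_combination(basis_functions, X, y):
    """Fit y ~ c0 + sum_i c_i * f_i(X) by ordinary least squares, where the
    basis functions f_i are candidate sub-expressions produced elsewhere
    (e.g. by a symbolic-regression search)."""
    Phi = np.column_stack([np.ones(len(X))] + [f(X) for f in basis_functions])
    coeffs, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    mse = float(np.mean((y - Phi @ coeffs) ** 2))
    return coeffs, mse
```

Because the linear coefficients are fitted exactly in one step, candidate models start with comparatively small error, which is the speed advantage the snippet above alludes to.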


Sequential Symbolic Regression with Genetic Programming

This chapter describes the Sequential Symbolic Regression (SSR) method, a new strategy for function approximation in symbolic regression. The SSR method is inspired by the sequential covering strategy from machine learning, but instead of sequentially reducing the size of the problem being solved, it sequentially transforms the original problem into potentially simpler problems. This transforma...


Knowledge Discovery through Symbolic Regression with HeuristicLab

This contribution describes how symbolic regression can be used for knowledge discovery with the open-source software HeuristicLab. HeuristicLab includes a large set of algorithms and problems for combinatorial optimization and for regression and classification, including symbolic regression with genetic programming. It provides a rich GUI to analyze and compare algorithms and identified models...



Journal

Journal title: Lecture Notes in Computer Science

Year: 2022

ISSN: 1611-3349, 0302-9743

DOI: https://doi.org/10.1007/978-3-031-25312-6_19